18 research outputs found

    Measuring Hubble's Constant in our Inhomogeneous Universe

    Get PDF
    Recent observations of Cepheids in the Virgo cluster have bolstered the evidence supporting a Hubble constant in the 70-90 km/s/Mpc range. This evidence, by and large, probes the expansion of the Universe within 100 Mpc. We investigate the possibility that the expansion rate within this region is systematically higher than the true expansion rate due to the presence of a local, large underdense region, or void. We begin by calculating the expected deviations between the locally measured Hubble constant and the true Hubble constant for a variety of models. We also discuss the expected correlations between these deviations and the mass fluctuation in the sample volume. We find that the fluctuations are small for the standard cold dark matter and mixed dark matter models but can be substantial in a number of interesting and viable nonstandard scenarios. However, deviations in the Hubble flow for a region of radius 200 Mpc are small for virtually all reasonable models. Therefore, methods based on supernovae or the Sunyaev-Zel'dovich effect, which can probe 200 Mpc scales, will be essential in determining the true Hubble constant. We discuss in detail the fluctuations induced in the cosmic background radiation by voids at the last scattering surface. In addition, we discuss the dipole and quadrupole fluctuations one would expect if the void enclosing us is aspherical or if we lie off-center. Comment: 20 pages (58K), 8 Postscript figures (111K compressed); submitted to MNRAS. Postscript source available at http://astro.queensu.ca/~dursi/preprints
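    The size of the bias at issue can be anticipated from linear perturbation theory. For a region whose mean density contrast, averaged over the sample volume, is δ, the locally measured expansion rate is offset from the global value roughly as in the standard relation below (a textbook linear-theory estimate for density parameter Ω₀, not a formula quoted from this abstract):

        \[
          \frac{\delta H}{H} \;\simeq\; -\frac{1}{3}\,\Omega_0^{0.6}\,\delta ,
        \]

    so a local underdensity (δ < 0) biases the measured Hubble constant high, and the bias shrinks as the averaging volume grows and |δ| falls.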

    Local Ignition in Carbon/Oxygen White Dwarfs -- I: One-zone Ignition and Spherical Shock Ignition of Detonations

    Full text link
    The details of the ignition of Type Ia supernovae remain fuzzy, despite the importance of this input for any large-scale model of the final explosion. Here, we begin a process of understanding the ignition of these hotspots by examining the burning of one zone of material, and then investigate the ignition of a detonation due to rapid heating at a single point. We numerically measure the ignition delay time for the onset of burning in mixtures of degenerate material and provide fitting formulae for conditions of relevance in the Type Ia problem. Using the neon abundance as a proxy for the white dwarf metallicity, we then find that ignition times can decrease by ~20% with the addition of even 5% of neon by mass. When temperature fluctuations that successfully kindle a region are very rare, such a reduction in ignition time can increase the probability of ignition by orders of magnitude. If the neon comes largely at the expense of carbon, a similar increase in the ignition time can occur. We then consider the ignition of a detonation by an explosive energy input in one localized zone, e.g., a Sedov blast wave leading to a shock-ignited detonation. Building on previous work on curved detonations, we find that surprisingly large inputs of energy are required to successfully launch a detonation, leading to required matchheads of ~4500 detonation thicknesses (tens of centimeters to hundreds of meters), which is orders of magnitude larger than naive considerations might suggest. This is a very difficult constraint to meet for some pictures of a deflagration-to-detonation transition, such as ignition by the Zel'dovich gradient mechanism in the distributed burning regime. Comment: 29 pages; accepted to ApJ. Comments welcome at http://www.cita.utoronto.ca/~ljdursi/thisweek/ . Updated version addressing referee comments.
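    The statement that a ~20% shorter ignition delay can change the probability of ignition by orders of magnitude follows from how steeply the probability of a rare temperature fluctuation falls with its amplitude. The sketch below is a toy illustration only: the Arrhenius-like delay-time scaling, the Gaussian fluctuation statistics, and every numerical value are assumptions made for illustration, not the paper's measured fits.

        # Toy model (all numbers hypothetical): how a ~20% reduction in ignition
        # delay time boosts the chance that a rare temperature fluctuation ignites
        # a region within the available time.
        import math

        Q = 30.0        # assumed steepness of the delay-time law, tau ~ exp(Q/T)
        T0 = 1.0        # mean temperature, arbitrary units
        T_thr = 1.05    # assumed baseline threshold temperature for ignition in time

        # Available time implied by the baseline threshold: t_avail = exp(Q / T_thr).
        ln_t_avail = Q / T_thr

        # A 20% shorter delay (tau -> 0.8 tau) lowers the threshold temperature:
        # 0.8 * exp(Q / T_new) = exp(Q / T_thr)  =>  Q / T_new = Q / T_thr + ln(1/0.8)
        T_new = Q / (ln_t_avail + math.log(1.0 / 0.8))

        def tail_prob(T_required, sigma):
            """Chance that a Gaussian temperature fluctuation exceeds the threshold."""
            z = (T_required - T0) / sigma
            return 0.5 * math.erfc(z / math.sqrt(2.0))

        print(f"threshold temperature: {T_thr:.4f} -> {T_new:.4f}")
        for sigma in (0.02, 0.01, 0.008):   # rarer and rarer fluctuations
            p0, p1 = tail_prob(T_thr, sigma), tail_prob(T_new, sigma)
            print(f"sigma={sigma:.3f}: P(ignite) {p0:.1e} -> {p1:.1e}  (x{p1 / p0:.0f})")

    The boost grows rapidly as the required fluctuation becomes rarer, reaching factors of hundreds in this toy example, which is the qualitative point made above.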

    GA4GH: International policies and standards for data sharing across genomic research and healthcare.

    Get PDF
    The Global Alliance for Genomics and Health (GA4GH) aims to accelerate biomedical advances by enabling the responsible sharing of clinical and genomic data through both harmonized data aggregation and federated approaches. The decreasing cost of genomic sequencing (along with other genome-wide molecular assays) and increasing evidence of its clinical utility will soon drive the generation of sequence data from tens of millions of humans, with increasing levels of diversity. In this perspective, we present the GA4GH strategies for addressing the major challenges of this data revolution. We describe the GA4GH organization, which is fueled by the development efforts of eight Work Streams and informed by the needs of 24 Driver Projects and other key stakeholders. We present the GA4GH suite of secure, interoperable technical standards and policy frameworks and review the current status of standards, their relevance to key domains of research and clinical care, and future plans of GA4GH. Broad international participation in building, adopting, and deploying GA4GH standards and frameworks will catalyze an unprecedented effort in data sharing that will be critical to advancing genomic medicine and ensuring that all populations can access its benefits.
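    As a concrete flavour of the interoperable technical standards mentioned above, the sketch below issues a request in the style of htsget, the GA4GH protocol for streaming aligned reads (htsget is a GA4GH standard, although it is not named in this abstract). The server URL and dataset identifier are hypothetical placeholders.

        # Minimal htsget-style request (the GA4GH read-streaming standard).
        # The server URL and dataset ID below are hypothetical placeholders.
        import json
        import urllib.parse
        import urllib.request

        base_url = "https://htsget.example.org/reads"   # hypothetical endpoint
        read_id = "EXAMPLE_SAMPLE_0001"                 # hypothetical dataset ID

        params = urllib.parse.urlencode({
            "format": "BAM",
            "referenceName": "chr1",
            "start": 1_000_000,
            "end": 1_100_000,
        })

        # The response is a JSON "ticket" listing the URLs from which the client
        # then fetches the actual blocks of read data.
        with urllib.request.urlopen(f"{base_url}/{read_id}?{params}") as resp:
            ticket = json.load(resp)

        for block in ticket["htsget"]["urls"]:
            print(block["url"])

    Because the request and ticket formats are standardized, the same client code can, in principle, talk to any compliant server regardless of where the data are hosted.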

    CODES AS INSTRUMENTS: COMMUNITY APPLICATION AND SIMULATION SOFTWARE FOR THE HARDWARE ARCHITECTURES OF THE NEXT DECADE

    No full text
    Modern astronomical research requires increasingly sophisticated computing facilities and software tools. Computational tools have become the fundamental means of turning raw observational data into scientific insight, and complex multi-physics simulation codes have developed into tools for numerical experiments that provide insight beyond classical theory. Canadian researchers need an environment for the development and maintenance of these critical tools. In particular, the drastically increased complexity of deeply heterogeneous hardware architectures poses a real challenge to using present and future HPC facilities. Without a national program in astrophysical simulation science and astronomy application code development, we are becoming vulnerable with respect to our ability to maximise the scientific return from existing and planned investments in astronomy. In addition, there are significant industrial/commercial HQP needs that a simulation and application code program could begin to address if it is properly aligned with academic training opportunities. We outline the framework and requirements for developing Canadian astronomical application and simulation codes, and code builders. In the US decadal plan process, voices are calling for a similar emphasis on developing infrastructure and incentives for open community codes.

    CODES AS INSTRUMENTS

    The Canadian computational landscape has changed remarkably since the last LRP, and in most ways for the better; Canadian astronomers now have access to much larger data sets and computational power, and there is some degree of staff support for these new large facilities (for an overview, see ...). This white paper singles out a final missing link and bottleneck that prevents Canadian astronomy and astrophysics from fully taking advantage of the potential of the new computer age: the underdeveloped and underfunded state of our simulation and application software development efforts. The role of these activities has long been recognized internationally, for example in the NSF report "Computation As a Tool for Discovery in Physics", the result of an expert workshop almost 10 years ago: "Simply put, we must move from a mode where we view computational science as an applied branch of theory to a mode where its true resource needs as a distinct research mode are recognized. Concretely, this means providing support for building the infrastructure (software) of computational science at levels commensurate with their true costs, just as we support construction and operation of experimental facilities." We like to employ this metaphor of simulation and application codes as instruments.

    Prospects of GPU Tensor Core Correlation for the SMA and the ngEHT

    No full text
    Building on the base of the existing telescopes of the Event Horizon Telescope (EHT) and ALMA, the next-generation EHT (ngEHT) aspires to deploy ∼10 more stations. The ngEHT targets an angular resolution of ∼15 microarcseconds, achieved using Very Long Baseline Interferometry (VLBI) at the shortest radio wavelengths, ∼1 mm. The Submillimeter Array (SMA) is both a standalone radio interferometer and a station of the EHT, and will conduct observations together with the new ngEHT stations. The future EHT + ngEHT array requires a dedicated correlator to process massive amounts of data. The current correlator-beamformer (CBF) of the SMA would also benefit from an upgrade, both to expand the SMA’s bandwidth and to match the EHT + ngEHT observations. The two correlators share the same basic architecture, so development time can be reduced by using common technology for both applications. This paper explores the prospects of using Tensor Core Graphics Processing Units (TC GPU) as the primary digital signal processing (DSP) engine. It describes the architecture, aspects of the detailed design, and approaches to performance optimization of a CBF using the “FX” approach. We describe some of the benefits and challenges of the TC GPU approach.
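    In an FX correlator, the F stage channelizes each antenna's voltage stream with an FFT and the X stage cross-multiplies and accumulates every antenna pair in every frequency channel; written per channel, the X stage is a batch of small complex matrix multiplications, which is what makes tensor-core GPUs a natural fit for it. The numpy sketch below illustrates the two stages with made-up array sizes and random data; it is a sketch of the general FX idea, not the SMA or ngEHT design.

        # Illustrative FX-correlator sketch: F stage = FFT channelization,
        # X stage = per-channel cross-multiply-and-accumulate written as a
        # batched matrix product (the form that maps onto GPU tensor cores).
        # Array sizes are made up for illustration.
        import numpy as np

        n_ant, n_chan, n_spectra = 8, 1024, 256   # antennas, channels, spectra to average

        # Fake digitized voltage streams: (antenna, time sample)
        rng = np.random.default_rng(0)
        voltages = rng.standard_normal((n_ant, n_chan * n_spectra))

        # F stage: split each stream into blocks of n_chan samples and FFT each block.
        # Result shape: (antenna, spectrum index, channel)
        spectra = np.fft.fft(voltages.reshape(n_ant, n_spectra, n_chan), axis=-1)

        # X stage: for every channel, accumulate the visibility matrix
        #   V[c] = sum over spectra of x(c) x(c)^H   (an n_ant x n_ant Hermitian matrix),
        # expressed as one matrix product per channel, i.e. a batched GEMM.
        x = spectra.transpose(2, 0, 1)                    # (channel, antenna, spectrum)
        visibilities = x @ x.conj().transpose(0, 2, 1)    # (channel, antenna, antenna)

        print(visibilities.shape)                         # (1024, 8, 8)

    Casting the X stage as a batched matrix multiplication is what allows the cross-correlation work to be fed to the GPU's matrix units instead of ordinary arithmetic units.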

    Retrospective evaluation of whole exome and genome mutation calls in 746 cancer samples

    Get PDF
    The Cancer Genome Atlas (TCGA) and International Cancer Genome Consortium (ICGC) curated consensus somatic mutation calls using whole exome sequencing (WES) and whole genome sequencing (WGS), respectively. Here, as part of the ICGC/TCGA Pan-Cancer Analysis of Whole Genomes (PCAWG) Consortium, which aggregated whole genome sequencing data from 2,658 cancers across 38 tumour types, we compare WES and WGS side-by-side from 746 TCGA samples, finding that ~80% of mutations overlap in covered exonic regions. We estimate that low variant allele fraction (VAF < 15%) and clonal heterogeneity contribute up to 68% of private WGS mutations and 71% of private WES mutations. We observe that ~30% of private WGS mutations trace to mutations identified by a single variant caller in WES consensus efforts. WGS captures both ~50% more variation in exonic regions and un-observed mutations in loci with variable GC-content. Together, our analysis highlights technological divergences between two reproducible somatic variant detection efforts.
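    A minimal sketch of the kind of side-by-side comparison described above: treat each platform's call set as a set of (chromosome, position, ref, alt) keys, measure the overlap, and inspect the variant allele fraction (VAF) of the calls private to each platform. The records below are made-up placeholders, not PCAWG data or the consortium's pipeline.

        # Toy WES-vs-WGS call-set comparison; the variant records are invented
        # placeholders used only to show the bookkeeping.
        LOW_VAF = 0.15   # the VAF < 15% threshold discussed above

        # Each call: (chrom, pos, ref, alt) -> variant allele fraction
        wes_calls = {("chr1", 1000, "A", "T"): 0.42, ("chr2", 2000, "G", "C"): 0.08}
        wgs_calls = {("chr1", 1000, "A", "T"): 0.40, ("chr3", 3000, "C", "G"): 0.11}

        shared = wes_calls.keys() & wgs_calls.keys()
        wes_private = wes_calls.keys() - wgs_calls.keys()
        wgs_private = wgs_calls.keys() - wes_calls.keys()

        overlap_frac = len(shared) / len(wes_calls.keys() | wgs_calls.keys())
        low_vaf_wgs_private = sum(wgs_calls[k] < LOW_VAF for k in wgs_private)

        print(f"overlap: {overlap_frac:.0%}")
        print(f"WGS-private calls with VAF < {LOW_VAF:.0%}: {low_vaf_wgs_private} of {len(wgs_private)}")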
